Feature representations using the reflected rectified linear unit (RReLU) activation

Authors
Abstract


Similar Articles

Deep Learning using Rectified Linear Units (ReLU)

We introduce the use of rectified linear units (ReLU) as the classification function in a deep neural network (DNN). Conventionally, ReLU is used as an activation function in DNNs, with the Softmax function as the classification function. However, there have been several studies on using a classification function other than Softmax, and this study is an addition to those. We accomplish this by ta...
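
The sketch below illustrates the idea described above: keeping ReLU as the hidden activation while also applying it at the output layer and classifying by argmax, instead of passing the final scores through Softmax. The layer sizes, random weights, and NumPy setup are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def relu(x):
    return np.maximum(0.0, x)

# Illustrative two-layer network: sizes and random weights are assumptions.
rng = np.random.default_rng(0)
x = rng.normal(size=(4, 8))                      # batch of 4 inputs, 8 features
W1, b1 = rng.normal(size=(8, 16)), np.zeros(16)
W2, b2 = rng.normal(size=(16, 3)), np.zeros(3)   # 3 classes

hidden = relu(x @ W1 + b1)       # conventional use: ReLU as hidden activation
scores = relu(hidden @ W2 + b2)  # the idea above: ReLU at the output instead of Softmax
predicted_class = scores.argmax(axis=1)          # classify by the largest rectified score
print(predicted_class)
```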


Deep Learning with S-Shaped Rectified Linear Activation Units

Rectified linear activation units are important components for state-of-the-art deep convolutional networks. In this paper, we propose a novel S-shaped rectified linear activation unit (SReLU) to learn both convex and non-convex functions, imitating the multiple function forms given by the two fundamental laws, namely the Weber-Fechner law and Stevens' law, in psychophysics and neural scien...
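
As a rough illustration of the S-shaped, piecewise-linear form described above, the sketch below applies an identity mapping inside a central interval and separate linear pieces beyond two thresholds. The parameter names (t_l, a_l, t_r, a_r) and the fixed values chosen here are illustrative assumptions; in SReLU these quantities are learned.

```python
import numpy as np

def srelu(x, t_l=-1.0, a_l=0.1, t_r=1.0, a_r=2.0):
    # Illustrative S-shaped rectified unit: three linear pieces joined at t_l and t_r.
    out = np.where(x >= t_r, t_r + a_r * (x - t_r), x)    # right-hand linear piece
    out = np.where(x <= t_l, t_l + a_l * (x - t_l), out)  # left-hand linear piece
    return out                                            # identity in between

x = np.linspace(-3.0, 3.0, 7)
print(srelu(x))
```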


Empirical Evaluation of Rectified Activations in Convolutional Network

In this paper, we investigate the performance of different types of rectified activation functions in convolutional neural networks: the standard rectified linear unit (ReLU), the leaky rectified linear unit (Leaky ReLU), the parametric rectified linear unit (PReLU), and a new randomized leaky rectified linear unit (RReLU). We evaluate these activation functions on a standard image classification task. Our expe...
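
Note that RReLU here abbreviates the randomized leaky rectified linear unit, a different unit from the reflected rectified linear unit in the article title above. The NumPy sketch below shows the four activations being compared; the slope values and the sampling range are illustrative choices rather than a definitive implementation.

```python
import numpy as np

def relu(x):
    return np.maximum(0.0, x)

def leaky_relu(x, alpha=0.01):
    # Fixed, small negative-side slope.
    return np.where(x > 0, x, alpha * x)

def prelu(x, alpha):
    # Same form as Leaky ReLU, but alpha is a learned parameter (passed in here).
    return np.where(x > 0, x, alpha * x)

def rrelu(x, lower=1/8, upper=1/3, training=True, rng=None):
    # Randomized leaky ReLU: the negative-side slope is sampled per element
    # during training and fixed to the mean of the range at test time.
    if training:
        rng = rng or np.random.default_rng()
        alpha = rng.uniform(lower, upper, size=x.shape)
    else:
        alpha = (lower + upper) / 2.0
    return np.where(x > 0, x, alpha * x)

x = np.array([-2.0, -0.5, 0.0, 0.5, 2.0])
print(relu(x), leaky_relu(x), prelu(x, 0.25), rrelu(x, training=False), sep="\n")
```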


DReLUs: Dual Rectified Linear Units

Rectified Linear Units (ReLUs) are widely used in feed-forward neural networks, and in convolutional neural networks in particular. However, they are rarely found in recurrent neural networks due to the unboundedness and the positive image of the rectified linear activation function. In this paper, we introduce Dual Rectified Linear Units (DReLUs), a novel type of rectified unit that comes w...
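
One way to obtain a rectified unit whose image also covers negative values, in the spirit of the dual rectified units described above, is to subtract two rectified linear projections of separate pre-activations. The sketch below is only an illustration of that idea under this assumption, not necessarily the exact formulation used in the paper.

```python
import numpy as np

def drelu(a, b):
    # Difference of two rectified pre-activations: the output can be positive,
    # negative, or exactly zero (when both inputs are non-positive).
    return np.maximum(0.0, a) - np.maximum(0.0, b)

a = np.array([1.5, -0.5, 0.2, -2.0])
b = np.array([0.5,  1.0, -0.3, -1.0])
print(drelu(a, b))
```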


Pattern Classification Using Rectified Nearest Feature Line Segment

This paper proposes a new classification method termed Rectified Nearest Feature Line Segment (RNFLS). It overcomes the drawbacks of the original Nearest Feature Line (NFL) classifier and possesses a novel property that centralizes the probability density of the initial sample distribution, which significantly enhances the classification ability. Another remarkable merit is that RNFLS is applic...
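
The sketch below illustrates the nearest-feature-line-segment idea behind the method described above: a query point is projected onto the segment joining each pair of same-class prototypes (clamping the projection to the segment, which avoids the extrapolation problem of unbounded feature lines), and the class owning the closest segment wins. It omits whatever additional rectification the full RNFLS method applies, and the prototype data and function names are illustrative assumptions.

```python
import numpy as np
from itertools import combinations

def point_to_segment_distance(q, p1, p2):
    # Distance from query q to the segment p1-p2, clamping the projection to [0, 1].
    d = p2 - p1
    denom = float(np.dot(d, d))
    t = 0.0 if denom == 0.0 else float(np.clip(np.dot(q - p1, d) / denom, 0.0, 1.0))
    return float(np.linalg.norm(q - (p1 + t * d)))

def nearest_segment_classify(q, prototypes_by_class):
    # Assign q to the class owning the closest prototype-pair segment.
    best_label, best_dist = None, np.inf
    for label, protos in prototypes_by_class.items():
        for p1, p2 in combinations(protos, 2):
            dist = point_to_segment_distance(q, np.asarray(p1), np.asarray(p2))
            if dist < best_dist:
                best_label, best_dist = label, dist
    return best_label

# Illustrative two-class prototype sets.
prototypes = {"A": [(0.0, 0.0), (1.0, 0.0), (0.5, 0.5)],
              "B": [(3.0, 3.0), (4.0, 3.0), (3.5, 3.5)]}
print(nearest_segment_classify(np.array([0.8, 0.2]), prototypes))
```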



Journal

Journal title: Big Data Mining and Analytics

Year: 2020

ISSN: 2096-0654

DOI: 10.26599/bdma.2019.9020024